This paper describes a fully spike-based neural network for optical flow estimation from Dynamic Vision Sensor data. A low-power embedded implementation of the method, which combines the Asynchronous Time-based Image Sensor with IBM's TrueNorth Neurosynaptic System, is presented. The sensor generates spikes with sub-millisecond resolution in response to scene illumination changes. These spikes are processed by a spiking neural network running on TrueNorth with a 1 millisecond resolution to accurately determine the order and time difference of spikes from neighboring pixels, and therefore infer the velocity. The spiking neural network is a variant of the Barlow-Levick method for optical flow estimation. The system is evaluated on two recordings for which ground truth motion is available, and achieves an Average Endpoint Error of 11% at an estimated power budget of under 80 mW for the sensor and computation.
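The core timing principle described above can be illustrated with a minimal sketch (not the paper's TrueNorth implementation): two neighboring pixels spike as an edge passes, and the velocity component along their axis follows from the spike order and the time difference. The pixel pitch and helper function below are assumptions for illustration only; the 1 ms time step matches the resolution stated in the abstract.

```python
PIXEL_PITCH_PX = 1.0   # assumed distance between neighboring pixels (pixels)
TIME_STEP_MS = 1.0     # TrueNorth tick resolution reported in the abstract

def velocity_from_spike_pair(t_a_ms: float, t_b_ms: float) -> float:
    """Estimate 1-D velocity (pixels/ms) from spike times of neighboring pixels A and B.

    A positive result means motion from A towards B: the sign comes from
    the spike order, the magnitude from the time difference.
    """
    dt = t_b_ms - t_a_ms
    if abs(dt) < TIME_STEP_MS:   # below temporal resolution: velocity undefined
        return float("nan")
    return PIXEL_PITCH_PX / dt

# Example: pixel B fires 4 ms after its neighbor A -> 0.25 px/ms towards B.
print(velocity_from_spike_pair(t_a_ms=10.0, t_b_ms=14.0))
```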